Search for: All records

Creators/Authors contains: "Soto, Pedro"

Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full-text articles may not yet be available free of charge during the embargo (administrative interval).

Some links on this page may take you to non-federal websites. Their policies may differ from those of this site.

  1. Free, publicly-accessible full text available January 1, 2027
  2. Free, publicly-accessible full text available November 24, 2025
  3. Differential equation models are crucial to scientific processes across many disciplines, and the values of model parameters are important for analyzing the behaviour of solutions. Identifying these values is known as parameter estimation, a type of inverse problem with applications in areas including industry, finance and biomedicine. A parameter is called globally identifiable if its value can be uniquely determined from the input and output functions. Checking the global identifiability of model parameters is a useful tool when exploring the well-posedness of a given model. This problem has been intensively studied for ordinary differential equation models, for which theory, several efficient algorithms and software packages have been developed. A comprehensive theory for PDEs has hitherto not been developed, owing to the complexity of initial and boundary conditions. Here, we provide theory and algorithms, based on differential algebra, for testing identifiability of polynomial PDE models. We showcase this approach on PDE models arising in the sciences. (A minimal sketch of the differential-algebra elimination idea appears after this list.)
    Free, publicly-accessible full text available January 30, 2026
  4. Coded distributed computation has become common practice for performing gradient descent on large datasets in order to mitigate stragglers and other faults. This paper proposes a novel algorithm that encodes the partial derivatives themselves and, furthermore, optimizes the codes by performing lossy compression on the derivative codewords: maximizing the information contained within each codeword while minimizing the information shared between codewords. The utility of this application of coding theory is a geometric consequence of a fact observed in optimization research: noise is tolerable, and sometimes even helpful, in gradient-descent-based learning algorithms, since it helps avoid overfitting and local minima. This stands in contrast to much of the current work on distributed coded computation, which focuses on recovering all of the data from the workers. A further contribution is that the low-weight nature of the coding scheme allows for asynchronous gradient updates, since the code can be iteratively decoded; i.e., a worker's result can be incorporated into the larger gradient immediately. Because the directional derivative is always a linear function of the direction vectors, our framework is robust: it can apply linear coding techniques to general machine learning frameworks such as deep neural networks. (A minimal gradient-coding sketch appears after this list.)
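
To make the differential-algebra idea in result 3 concrete, here is a minimal sketch using sympy on a toy ODE model of my own choosing (x' = a*x with output y = c*x); the model, the symbols, and the elimination-by-substitution step are illustrative assumptions, not the paper's algorithm, which handles general polynomial PDE models.

```python
# Illustrative sketch only: a toy ODE model, not a model from the paper.
# Model: x'(t) = a*x(t), observed output y(t) = c*x(t).
import sympy as sp

t = sp.symbols("t")
a, c = sp.symbols("a c", nonzero=True)
x, y = sp.Function("x"), sp.Function("y")

# State equation and observation equation.
state_eq = sp.Eq(x(t).diff(t), a * x(t))   # x' = a x
obs_eq = sp.Eq(y(t), c * x(t))             # y  = c x

# Differential elimination (here, trivially by substitution): remove the
# unobserved state x to obtain an input-output equation in y alone.
x_in_terms_of_y = sp.solve(obs_eq, x(t))[0]            # x = y / c
io_eq = state_eq.subs(x(t), x_in_terms_of_y).doit()    # y'/c = a*y/c
io_eq = sp.Eq(sp.expand(io_eq.lhs * c), sp.expand(io_eq.rhs * c))

print(io_eq)  # Eq(Derivative(y(t), t), a*y(t))

# Reading off the input-output equation y' = a*y: the coefficient a is
# uniquely determined by observed data, so a is globally identifiable.
# The parameter c has cancelled out, so c is not identifiable from y alone.
```

The same pattern, eliminating unobserved states to get relations among inputs, outputs and their derivatives, and then asking whether the parameters are determined by the coefficients of those relations, is what the differential-algebra approach generalizes to PDE models.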
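
For result 4, here is a minimal numpy sketch of straggler-tolerant gradient coding in the general spirit the abstract describes, encoding partial gradients with a linear code so the full gradient survives a missing worker. The least-squares model, partition sizes, and the particular encoding matrix B are assumptions for illustration (a classic 3-worker, 1-straggler construction), not the lossy compression scheme proposed in the paper.

```python
# Illustrative sketch only: generic linear gradient coding, not the
# paper's optimized/lossy scheme.
import numpy as np

rng = np.random.default_rng(0)

# Toy least-squares problem, data split into 3 partitions.
X = rng.normal(size=(300, 5))
w_true = rng.normal(size=5)
y = X @ w_true
parts = np.array_split(np.arange(300), 3)

def partial_grad(w, idx):
    """Gradient of 0.5*||X w - y||^2 restricted to one data partition."""
    Xi, yi = X[idx], y[idx]
    return Xi.T @ (Xi @ w - yi)

w = np.zeros(5)
g = np.stack([partial_grad(w, idx) for idx in parts])  # 3 partial gradients

# Each worker transmits one linear combination of two partial gradients;
# any 2 of the 3 workers suffice to recover the full gradient, so one
# straggler is tolerated.  (Assumed example encoding matrix.)
B = np.array([[0.5, 1.0, 0.0],
              [0.0, 1.0, -1.0],
              [0.5, 0.0, 1.0]])
codewords = B @ g  # what the 3 workers transmit

# Suppose worker 1 straggles; decode from workers {0, 2}: find a with
# a^T B_S = (1, 1, 1), so that a^T codewords_S = g1 + g2 + g3.
alive = [0, 2]
a, *_ = np.linalg.lstsq(B[alive].T, np.ones(3), rcond=None)
decoded = a @ codewords[alive]

print(np.allclose(decoded, g.sum(axis=0)))  # True
```

Because every codeword is a linear function of the partial gradients, the same decoding step works for any model whose update is built from directional derivatives, which is the linearity property the abstract appeals to.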